Preserving Intermediate Objectives: One Simple Trick to Improve Learning for Hierarchical Models
Authors
Abstract
Hierarchical models are used in a wide variety of problems characterized by task hierarchies, in which predictions on smaller subtasks inform the prediction of a final task. Typically, neural networks are first trained for the subtasks, and the predictions of these networks are then used as additional features when training a model and performing inference for the final task. In this work, we focus on improving learning for such hierarchical models and demonstrate our method on the task of speaker trait prediction. Speaker trait prediction aims to computationally identify which personality traits a speaker might be perceived to have, and has been of great interest to both the Artificial Intelligence and Social Science communities. Persuasiveness prediction in particular has drawn attention, as persuasive speakers exert a large influence on our thoughts, opinions, and beliefs. We examine how leveraging the relationships between related speaker traits in a hierarchical structure can improve our ability to predict how persuasive a speaker is. We present a novel algorithm that allows us to backpropagate through this hierarchy. This hierarchical model achieves a 25% relative error reduction in classification accuracy over current state-of-the-art methods on the publicly available POM dataset.
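The setup the abstract describes — subtask models whose predictions are fed, alongside the raw features, into a final model, with the final loss backpropagated through the hierarchy into the subtask models — can be sketched in plain NumPy. This is a minimal illustrative sketch using linear models and hand-derived gradients; all names, shapes, and the linear architecture are assumptions for exposition, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))          # 8 examples, 4 input features
y = rng.normal(size=(8, 1))          # final-task targets

w_sub = rng.normal(size=(4, 1))      # subtask model weights (hypothetical)
w_fin = rng.normal(size=(5, 1))      # final model sees [x, subtask_pred]

def forward(x, w_sub, w_fin):
    s = x @ w_sub                            # subtask prediction
    z = np.concatenate([x, s], axis=1)       # raw features + subtask output
    return s, z, z @ w_fin                   # final-task prediction

s, z, y_hat = forward(x, w_sub, w_fin)
loss = ((y_hat - y) ** 2).mean()

# Backward pass: the chain rule carries the final loss into BOTH models,
# which is what "backpropagating through the hierarchy" amounts to here.
g_yhat = 2 * (y_hat - y) / y.size    # dL/dy_hat
g_wfin = z.T @ g_yhat                # dL/dw_fin
g_s = g_yhat @ w_fin[4:].T           # gradient flowing into the subtask output
g_wsub = x.T @ g_s                   # dL/dw_sub, reached only via the hierarchy

# A sufficiently small gradient step should lower the final-task loss.
lr = 0.02
_, _, y_hat2 = forward(x, w_sub - lr * g_wsub, w_fin - lr * g_wfin)
loss2 = ((y_hat2 - y) ** 2).mean()
```

The key contrast with the usual pipeline is that `w_sub` is updated by the final-task loss rather than frozen after subtask training; in practice the subtask models would be full neural networks and an autodiff framework would compute these gradients.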
Similar resources
A Geometry Preserving Kernel over Riemannian Manifolds
The kernel trick and projection to tangent spaces are two options for linearizing data points that lie on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...
View-Tolerant Face Recognition and Hebbian Learning Imply Mirror-Symmetric Neural Tuning to Head Orientation
The primate brain contains a hierarchy of visual areas, dubbed the ventral stream, which rapidly computes object representations that are both specific for object identity and robust against identity-preserving transformations, like depth rotations [1, 2]. Current computational models of object recognition, including recent deep-learning networks, generate these properties through a hierarchy o...
Hierarchical Multi-label Classification using Fully Associative Ensemble Learning
Traditional flat classification methods ( e.g. , binary or multi-class classification) neglect the structural information between different classes. In contrast, Hierarchical Multi-label Classification (HMC) considers the structural information embedded in the class hierarchy, and uses it to improve classification performance. In this paper, we propose a local hierarchical ensemble framework fo...
Survey of Learning Models in Medical Students of Rafsanjan University of Medical Sciences in 2019: A Descriptive Study
Background and Objectives: All human progress is in some way related to learning. One of the factors affecting learning is learning style. By knowing students' learning styles, it is possible to provide teaching suited to their individual style. The aim of this study was to determine the learning styles of medical students of Rafsanjan University of Medical Sciences in 201...
Linguistic modeling by hierarchical systems of linguistic rules
In this paper, we propose an approach for designing linguistic models that are accurate to a high degree and may be suitably interpreted. This approach is based on the development of a Hierarchical System of Linguistic Rules learning methodology, conceived as a refinement of simple linguistic models which, while preserving their descriptive power, introduces sma...
Journal title:
- CoRR
Volume abs/1706.07867, Issue -
Pages -
Publication date 2017